A Family of Preconditioned Iteratively Regularized Methods For Nonlinear Minimization

Authors

  • Alexandra Smirnova
  • Rosemary A Renaut
Abstract

The preconditioned iteratively regularized Gauss-Newton algorithm for the minimization of general nonlinear functionals was introduced by Smirnova, Renaut and Khan (2007). In this paper, we establish theoretical convergence results for an extended stabilized family of Generalized Preconditioned Iterative methods which includes M-times iterated Tikhonov regularization with line search. Numerical schemes illustrating the theoretical results are also presented.
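As a rough illustration of the kind of scheme this family covers, the sketch below runs an iteratively regularized Gauss-Newton outer loop whose linearized step is computed with M-times iterated Tikhonov regularization. It is only a minimal sketch under made-up assumptions: the toy operator F, the geometric schedule for the regularization parameter, the identity preconditioner, and the full (undamped) step in place of a line search are illustrative choices, not the authors' algorithm.

```python
# Minimal sketch of an iteratively regularized Gauss-Newton loop with an
# M-times iterated Tikhonov inner step, for a toy problem F(x) ~= y.
# F, the alpha schedule, and all parameters are illustrative assumptions.
import numpy as np

def F(x):                       # toy nonlinear forward operator (assumption)
    return np.array([x[0]**2 + x[1], np.sin(x[0]) + x[1]**2])

def jacobian(x):                # its Jacobian
    return np.array([[2.0 * x[0], 1.0],
                     [np.cos(x[0]), 2.0 * x[1]]])

def irgn_iterated_tikhonov(y, x0, alpha0=1.0, q=0.5, M=2, outer=20):
    """Outer Gauss-Newton loop; the linearized correction is built by M
    iterated-Tikhonov sweeps at regularization level alpha_k."""
    x, alpha = x0.copy(), alpha0
    for _ in range(outer):
        J = jacobian(x)
        A = J.T @ J
        b = J.T @ (y - F(x) + J @ (x - x0))   # linearized data term
        u = np.zeros_like(x)
        for _ in range(M):                    # M-times iterated Tikhonov filter
            u += np.linalg.solve(A + alpha * np.eye(len(x)), b - A @ u)
        x = x0 + u                            # (a line search could damp this update)
        alpha *= q                            # geometrically decaying regularization
    return x

y = F(np.array([1.0, 0.5]))                   # synthetic exact data
print(irgn_iterated_tikhonov(y, x0=np.array([0.8, 0.3])))
```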


Similar resources

An Efficient Method for Large-Scale l1-Regularized Convex Loss Minimization

Convex loss minimization with l1 regularization has been proposed as a promising method for feature selection in classification (e.g., l1-regularized logistic regression) and regression (e.g., l1-regularized least squares). In this paper we describe an efficient interior-point method for solving large-scale l1-regularized convex loss minimization problems that uses a preconditioned conjugate gr...

Full text
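For orientation, the sketch below spells out the l1-regularized least-squares objective referred to in this abstract and minimizes it with a plain proximal-gradient (ISTA) iteration, a deliberately simpler swap-in for the interior-point / preconditioned conjugate gradient method the paper describes. The data A, b, the regularization weight lam, and the iteration count are made-up illustrations.

```python
# Sketch: minimize 0.5*||A x - b||^2 + lam*||x||_1 with ISTA (not the paper's
# interior-point method).  Data and parameters are illustrative assumptions.
import numpy as np

def soft_threshold(v, t):
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ista_lasso(A, b, lam, iters=500):
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the smooth part
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        grad = A.T @ (A @ x - b)           # gradient of the quadratic loss
        x = soft_threshold(x - grad / L, lam / L)
    return x

rng = np.random.default_rng(0)
A = rng.standard_normal((50, 100))
x_true = np.zeros(100); x_true[:5] = 1.0   # sparse ground truth
b = A @ x_true + 0.01 * rng.standard_normal(50)
print(np.nonzero(ista_lasso(A, b, lam=0.1))[0][:10])
```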

A comparison of the computational performance of Iteratively Reweighted Least Squares and alternating minimization algorithms for ℓ1 inverse problems

Alternating minimization algorithms with a shrinkage step, derived within the Split Bregman (SB) or Alternating Direction Method of Multipliers (ADMM) frameworks, have become very popular for ℓ1-regularized problems, including Total Variation and Basis Pursuit Denoising. It appears to be generally assumed that they deliver much better computational performance than older methods such as Iterativ...

Full text
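To make the comparison concrete, the following sketch implements toy versions of both algorithm families for the l1-regularized least-squares problem 0.5*||A x - b||^2 + lam*||x||_1: an Iteratively Reweighted Least Squares sweep and an ADMM loop with the shrinkage (soft-threshold) step. The smoothing parameter eps, the penalty rho, and the dense linear solves are illustrative simplifications, not the setups benchmarked in the paper.

```python
# Toy IRLS and ADMM solvers for 0.5*||A x - b||^2 + lam*||x||_1.
# All parameters are illustrative assumptions.
import numpy as np

def irls_l1(A, b, lam, iters=50, eps=1e-6):
    """IRLS: replace |x_i| by a weighted quadratic term with weights
    1/(|x_i| + eps) and solve a weighted least-squares problem per sweep."""
    x = np.linalg.lstsq(A, b, rcond=None)[0]
    for _ in range(iters):
        W = np.diag(1.0 / (np.abs(x) + eps))
        x = np.linalg.solve(A.T @ A + lam * W, A.T @ b)
    return x

def admm_l1(A, b, lam, rho=1.0, iters=200):
    """ADMM: quadratic x-update, shrinkage (soft-threshold) z-update, dual update."""
    n = A.shape[1]
    x = np.zeros(n); z = np.zeros(n); u = np.zeros(n)
    AtA_rhoI = A.T @ A + rho * np.eye(n)
    Atb = A.T @ b
    for _ in range(iters):
        x = np.linalg.solve(AtA_rhoI, Atb + rho * (z - u))               # quadratic step
        z = np.sign(x + u) * np.maximum(np.abs(x + u) - lam / rho, 0.0)  # shrinkage step
        u = u + x - z                                                    # dual update
    return z
```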

B-Preconditioned Minimization Algorithms for Variational Data Assimilation with the Dual Formulation

Variational data assimilation problems arising in meteorology and oceanography require the solution of a regularized nonlinear least-squares problem. Practical solution algorithms are based on the incremental (Truncated Gauss-Newton) approach, which involves the iterative solution of a sequence of linear least-squares (quadratic minimization) sub-problems. Each sub-problem can be solved using a...

Full text
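As a schematic of the incremental (truncated Gauss-Newton) approach described here, the sketch below wraps a few conjugate-gradient iterations around each linearized quadratic sub-problem of a toy variational cost function. The observation operator H, the covariance matrices, and the primal (state-space) formulation are illustrative assumptions; the paper's B-preconditioned dual formulation is not reproduced.

```python
# Schematic incremental (truncated Gauss-Newton) loop for a toy variational cost
# J(x) = 0.5*(x-xb)' Binv (x-xb) + 0.5*(H(x)-y)' Rinv (H(x)-y).
# H, H_jac, Binv, Rinv and the iteration counts are illustrative assumptions.
import numpy as np

def conjugate_gradient(Amul, b, iters=10):
    """A few CG iterations for the SPD system A x = b (truncated inner solve)."""
    x = np.zeros_like(b); r = b - Amul(x); p = r.copy()
    for _ in range(iters):
        Ap = Amul(p)
        step = (r @ r) / (p @ Ap)
        x += step * p
        r_new = r - step * Ap
        p = r_new + ((r_new @ r_new) / (r @ r)) * p
        r = r_new
    return x

def incremental_gauss_newton(y, xb, H, H_jac, Binv, Rinv, outer=3, inner=10):
    x = xb.copy()
    for _ in range(outer):
        Hk = H_jac(x)                                   # linearize about current x
        d = y - H(x)                                    # innovation vector
        Amul = lambda dx: Binv @ dx + Hk.T @ (Rinv @ (Hk @ dx))
        rhs = Hk.T @ (Rinv @ d) - Binv @ (x - xb)
        x = x + conjugate_gradient(Amul, rhs, iters=inner)   # quadratic sub-problem
    return x
```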

Further convergence results on the general iteratively regularized Gauss-Newton methods under the discrepancy principle

We consider the general iteratively regularized Gauss-Newton methods
$$x_{k+1} = x_0 - g_{\alpha_k}\bigl(F'(x_k)^{*}F'(x_k)\bigr)\,F'(x_k)^{*}\bigl(F(x_k) - y^{\delta} - F'(x_k)(x_k - x_0)\bigr)$$
for solving nonlinear inverse problems $F(x) = y$ using only the available noisy data $y^{\delta}$ of $y$ satisfying $\|y^{\delta} - y\| \le \delta$ with a given small noise level $\delta > 0$. In order to produce a reasonable approximation to the sought solution, we terminate the iteration by the discre...

Full text
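A minimal sketch of the discrepancy-principle termination mentioned in this abstract is given below, using the simplest member of the filter family (the Tikhonov choice g_alpha(s) = 1/(s + alpha)). The stopping constant tau, the geometric alpha schedule, and the dense linear algebra are illustrative assumptions.

```python
# Sketch: IRGN-type iteration stopped by the discrepancy principle
# ||F(x_k) - y_delta|| <= tau * delta.  The Tikhonov filter, tau and the
# alpha schedule are illustrative assumptions.
import numpy as np

def irgn_discrepancy(F, jacobian, y_delta, delta, x0,
                     tau=1.5, alpha0=1.0, q=0.5, max_iter=50):
    x, alpha = x0.copy(), alpha0
    for k in range(max_iter):
        if np.linalg.norm(F(x) - y_delta) <= tau * delta:
            return x, k                       # discrepancy principle satisfied
        J = jacobian(x)
        A = J.T @ J
        b = J.T @ (y_delta - F(x) + J @ (x - x0))
        # Tikhonov filter g_alpha(A) b = (A + alpha I)^{-1} b
        x = x0 + np.linalg.solve(A + alpha * np.eye(len(x0)), b)
        alpha *= q
    return x, max_iter
```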

Regularized fractional derivatives in Colombeau algebra

The present study aims to establish an existence and uniqueness result for a system in the extended Colombeau algebra. The Caputo fractional derivative is used for solving the system of ODEs. In addition, the Riesz fractional derivative in the Colombeau generalized algebra is considered; the purpose of introducing the Riesz fractional derivative is to regularize it in the Colombeau sense. We also give a solution to...

Full text
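For reference, the standard textbook definitions of the two fractional derivatives named in this abstract are recalled below; the Colombeau-regularized versions studied in the paper may differ in detail.

```latex
% Caputo fractional derivative of order 0 < \alpha < 1
{}^{C}\!D^{\alpha}_{t} f(t) \;=\; \frac{1}{\Gamma(1-\alpha)}
  \int_{0}^{t} \frac{f'(s)}{(t-s)^{\alpha}} \, ds .

% Riesz fractional derivative (\alpha \neq 1), expressed through the left and
% right Riemann--Liouville derivatives
\frac{d^{\alpha} f(x)}{d|x|^{\alpha}} \;=\;
  -\frac{1}{2\cos(\pi\alpha/2)}
  \Bigl( {}_{-\infty}D_{x}^{\alpha} f(x) + {}_{x}D_{+\infty}^{\alpha} f(x) \Bigr).
```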


Journal title:

Volume   Issue

Pages  -

Publication date: 2008